Nonlinear residual minimization by iteratively reweighted least squares

Author

  • Juliane Sigl
Abstract

In this paper we address the numerical solution of minimal norm residuals of nonlinear equations in finite dimensions. We take particular inspiration from the problem of finding a sparse vector solution of phase retrieval problems by means of greedy algorithms based on iterative residual minimization in the ℓp-norm, for 1 ≤ p ≤ 2. Due to the mild smoothness of the problem, especially for p → 1, we develop and analyze a generalized version of Iteratively Reweighted Least Squares (IRLS). This simple and efficient algorithm reduces optimization problems involving non-quadratic, possibly non-convex and non-smooth cost functions to a sequence of ordinary least squares problems, which can eventually be tackled by efficient numerical optimization methods. While its analysis has by now been developed in many different contexts (e.g., sparse vector and low-rank matrix optimization, and the solution of PDEs involving p-Laplacians) when the model equation is linear, no results have so far been provided for nonlinear equations. Here we address precisely the convergence and the rate of error decay of IRLS for such nonlinear problems. The convergence analysis is based on a reformulation of the algorithm as an alternating minimization of an energy functional, whose main variables are the competitors to solutions of the intermediate reweighted least squares problems and their weights. Under a specific coercivity condition, often verified in practice, and assumptions of local convexity, we are able to show convergence of IRLS to minimizers of the nonlinear residual problem. When local convexity is lacking, we propose an appropriate convexification by quadratic perturbations, and we show convergence of this modified procedure to at least a very good approximation of stationary points of the original problem. To illustrate the theoretical results we conclude the paper with several numerical experiments. We compare IRLS with standard Matlab optimization functions on a simple and easily presentable example, and we further validate our theoretical results numerically in the more involved framework of phase retrieval problems, which are our main motivation. Finally, we examine the recovery capability of the algorithm for data corrupted by impulsive noise, where sparsification of the residual is desired.
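The abstract describes IRLS as an alternating minimization over candidate solutions and their weights, which turns the ℓp residual problem into a sequence of weighted least squares problems. The sketch below illustrates this general idea in Python; it is not the paper's exact algorithm. The smoothing schedule for eps, the Gauss-Newton inner solver, and the toy phase-retrieval-style residual F are all assumptions made purely for illustration.

```python
# Minimal IRLS sketch for nonlinear residual minimization (illustrative only):
# approximately minimize sum_i |F_i(x)|^p, 1 <= p <= 2, by alternating a
# weight update with a weighted least squares (Gauss-Newton type) step.
import numpy as np

def irls_nonlinear(F, J, x0, p=1.0, eps=1e-2, n_outer=50, n_inner=5):
    x = np.asarray(x0, dtype=float)
    for _ in range(n_outer):
        r = F(x)
        # Smoothed lp weights: w_i = (F_i(x)^2 + eps^2)^((p-2)/2)
        w = (r**2 + eps**2) ** ((p - 2.0) / 2.0)
        # Inner problem: minimize sum_i w_i * F_i(x)^2, here by a few
        # Gauss-Newton steps (one possible choice of inner solver).
        for _ in range(n_inner):
            r = F(x)
            A = J(x) * np.sqrt(w)[:, None]   # row-weighted Jacobian
            b = -np.sqrt(w) * r              # weighted residual
            dx, *_ = np.linalg.lstsq(A, b, rcond=None)
            x = x + dx
        # Shrink eps to sharpen the smoothed lp approximation.
        eps = max(0.9 * eps, 1e-8)
    return x

if __name__ == "__main__":
    # Toy phase-retrieval-style residual F_i(x) = <a_i, x>^2 - y_i on
    # synthetic data, started near the true solution; purely illustrative.
    rng = np.random.default_rng(0)
    A_mat = rng.standard_normal((40, 10))
    x_true = rng.standard_normal(10)
    y = (A_mat @ x_true) ** 2
    F = lambda x: (A_mat @ x) ** 2 - y
    J = lambda x: 2.0 * (A_mat @ x)[:, None] * A_mat
    x_hat = irls_nonlinear(F, J, x0=x_true + 0.1 * rng.standard_normal(10), p=1.0)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

For p = 2 the weights are constant and the scheme reduces to ordinary (Gauss-Newton) least squares; for p close to 1, large residual components receive small weights, so outliers are downweighted and the residual is driven toward sparsity, which is the behavior exploited in the impulsive-noise experiments mentioned in the abstract.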


Similar articles

Semi-calibrated Near-Light Photometric Stereo

We tackle the nonlinear problem of photometric stereo under close-range pointwise sources, when the intensities of the sources are unknown (so-called semi-calibrated setup). A variational approach aiming at robust joint recovery of depth, albedo and intensities is proposed. The resulting nonconvex model is numerically resolved by a provably convergent alternating minimization scheme, where the ...


Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed ℓq Minimization

In this paper, we first study ℓq minimization and its associated iteratively reweighted algorithm for recovering sparse vectors. Unlike most existing work, we focus on unconstrained ℓq minimization, for which we show a few advantages on noisy measurements and/or approximately sparse vectors. Inspired by the results in [Daubechies et al., Comm. Pure Appl. Math., 63 (2010), pp. 1–38] for constrained ...


A comparison of typical ℓp minimization algorithms

Recently, compressed sensing has been widely applied to various areas such as signal processing, machine learning, and pattern recognition. To find the sparse representation of a vector w.r.t. a dictionary, an ℓ1 minimization problem, which is convex, is usually solved in order to overcome the computational difficulty. However, to guarantee that the ℓ1 minimizer is close to the sparsest solutio...


A comparison of the computational performance of Iteratively Reweighted Least Squares and alternating minimization algorithms for ℓ1 inverse problems

Alternating minimization algorithms with a shrinkage step, derived within the Split Bregman (SB) or Alternating Direction Method of Multipliers (ADMM) frameworks, have become very popular for ℓ1-regularized problems, including Total Variation and Basis Pursuit Denoising. It appears to be generally assumed that they deliver much better computational performance than older methods such as Iterativ...


Whitening track residuals with PEFs in IRLS approach to the Sea of Galilee

We apply an Iteratively Reweighted Least Squares (IRLS) approach to create a map of the Sea of Galilee. We use a bank of Prediction Error Filters (PEFs) as a residual whitener to reduce the acquisition footprint and map artifacts caused by non-Gaussian noise in the data.



Journal:
  • Comp. Opt. and Appl.

Volume 64, Issue

Pages  -

Publication date: 2016